Cores of Cooperative Games in Information Theory
Cores of cooperative games are ubiquitous in information theory, and arise
most frequently in the characterization of fundamental limits in various
scenarios involving multiple users. Examples include classical settings in
network information theory such as Slepian-Wolf source coding and multiple
access channels, classical settings in statistics such as robust hypothesis
testing, and new settings at the intersection of networking and statistics such
as distributed estimation problems for sensor networks. Cooperative game theory
allows one to understand aspects of all of these problems from a fresh and
unifying perspective that treats users as players in a game, sometimes leading
to new insights. At the heart of these analyses are fundamental dualities that
have been long studied in the context of cooperative games; for information
theoretic purposes, these are dualities between information inequalities on the
one hand and properties of rate, capacity or other resource allocation regions
on the other.
Comment: 12 pages, published at http://www.hindawi.com/GetArticle.aspx?doi=10.1155/2008/318704 in EURASIP Journal on Wireless Communications and Networking, Special Issue on "Theory and Applications in Multiuser/Multiterminal Communications", April 2008
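As a concrete handle on the game-theoretic view described above (an illustrative sketch, not taken from the paper): the Slepian-Wolf rate region for lossless distributed source coding is known to coincide with the core of a game in which each coalition $S$ must cover the conditional entropy of its sources given the rest. A minimal core-membership check in Python, with a hypothetical two-source example:

```python
from itertools import chain, combinations

def nonempty_subsets(players):
    """All nonempty subsets of a set of players."""
    s = list(players)
    return chain.from_iterable(combinations(s, r) for r in range(1, len(s) + 1))

def in_core(rates, H, players):
    """Check whether a rate vector lies in the core of the Slepian-Wolf game:
    sum_i R_i = H(all sources), and for every nonempty coalition S,
    sum_{i in S} R_i >= H(X_S | X_{S^c}) = H(all) - H(S^c)."""
    total = H[frozenset(players)]
    if abs(sum(rates[i] for i in players) - total) > 1e-9:
        return False
    for S in nonempty_subsets(players):
        Sc = frozenset(players) - frozenset(S)
        cond = total - (H[Sc] if Sc else 0.0)  # conditional entropy of S given S^c
        if sum(rates[i] for i in S) < cond - 1e-9:
            return False
    return True

# Hypothetical pair of correlated binary sources (entropies in bits):
# H(X1) = H(X2) = 1, H(X1, X2) = 1.5, so H(Xi | Xj) = 0.5.
H = {frozenset({1}): 1.0, frozenset({2}): 1.0, frozenset({1, 2}): 1.5}
players = {1, 2}
print(in_core({1: 0.75, 2: 0.75}, H, players))  # symmetric point: in the core
print(in_core({1: 0.25, 2: 1.25}, H, players))  # violates R1 >= H(X1|X2) = 0.5
```

The core here is exactly the dominant face of the Slepian-Wolf region: rate vectors summing to the joint entropy while every coalition covers its conditional entropy.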
Concentration of the information in data with log-concave distributions
A concentration property of the functional $-\log f(X)$ is demonstrated,
when a random vector $X$ has a log-concave density $f$ on $\mathbb{R}^n$. This
concentration property implies in particular an extension of the
Shannon-McMillan-Breiman strong ergodic theorem to the class of discrete-time
stochastic processes with log-concave marginals.
Comment: Published at http://dx.doi.org/10.1214/10-AOP592 in the Annals of Probability (http://www.imstat.org/aop/) by the Institute of Mathematical Statistics (http://www.imstat.org)
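To see what such concentration looks like in the simplest case (an illustrative sketch, not from the paper): take $f$ to be the product of $n$ standard exponential densities, which is log-concave, so $-\log f(X) = \sum_i X_i$. The per-coordinate information content $-\log f(X)/n$ then clusters tightly around the differential entropy per coordinate, which is 1 nat for Exp(1). A standard-library Monte Carlo check:

```python
import math
import random

random.seed(0)

def info_content_per_dim(x):
    """-log f(x) / n for the product Exp(1) density f(x) = exp(-sum_i x_i),
    which is log-concave on the positive orthant."""
    return sum(x) / len(x)  # since -log f(x) = sum_i x_i

n, trials = 2000, 200
h = 1.0  # differential entropy of Exp(1), in nats
devs = []
for _ in range(trials):
    x = [random.expovariate(1.0) for _ in range(n)]
    devs.append(abs(info_content_per_dim(x) - h))

# Typical deviations are of order 1/sqrt(n), i.e. a few percent here.
print(max(devs), sum(devs) / len(devs))
```

The point of the paper is that this sharp concentration is not special to product densities: it holds for every log-concave density on $\mathbb{R}^n$.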
Sumset and Inverse Sumset Inequalities for Differential Entropy and Mutual Information
The sumset and inverse sumset theories of Freiman, Pl\"{u}nnecke and Ruzsa,
give bounds connecting the cardinality of the sumset $A+B$ of two discrete sets $A, B$, to the cardinalities (or the finer
structure) of the original sets $A, B$. For example, the sum-difference bound of
Ruzsa states that, $|A+B| \leq |A-B|^3 / (|A|\,|B|)$, where the difference set $A-B := \{a - b : a \in A,\ b \in B\}$. Interpreting the differential entropy $h(X)$ of a
continuous random variable $X$ as (the logarithm of) the size of the effective
support of $X$, the main contribution of this paper is a series of natural
information-theoretic analogs for these results. For example, the Ruzsa
sum-difference bound becomes the new inequality, $h(X+Y) \leq 3h(X-Y) - h(X) - h(Y)$, for any pair of independent continuous random variables $X$ and $Y$.
Our results include differential-entropy versions of Ruzsa's triangle
inequality, the Pl\"{u}nnecke-Ruzsa inequality, and the
Balog-Szemer\'{e}di-Gowers lemma. Also we give a differential entropy version
of the Freiman-Green-Ruzsa inverse-sumset theorem, which can be seen as a
quantitative converse to the entropy power inequality. Versions of most of
these results for the discrete entropy $H$ were recently proved by Tao,
relying heavily on a strong, functional form of the submodularity property of
$H$. Since differential entropy is {\em not} functionally submodular, in the
continuous case many of the corresponding discrete proofs fail, in many cases
requiring substantially new proof strategies. We find that the basic property
that naturally replaces the discrete functional submodularity, is the data
processing property of mutual information.
Comment: 23 pages
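The sum-difference inequality can be sanity-checked in closed form for independent Gaussians, where both $X+Y$ and $X-Y$ are Gaussian with variance equal to the sum of the two variances; there the bound reduces to $\log(ab) \leq 2\log(a+b)$. An illustrative script (not from the paper):

```python
import math

def h_gauss(var):
    """Differential entropy (nats) of a one-dimensional N(0, var)."""
    return 0.5 * math.log(2 * math.pi * math.e * var)

def ruzsa_gap(a, b):
    """RHS minus LHS of the entropy sum-difference bound
    h(X+Y) <= 3 h(X-Y) - h(X) - h(Y)
    for independent X ~ N(0, a), Y ~ N(0, b); here both X+Y and X-Y
    are N(0, a+b), so the gap simplifies to log(a+b) - 0.5*log(a*b) >= 0."""
    lhs = h_gauss(a + b)
    rhs = 3 * h_gauss(a + b) - h_gauss(a) - h_gauss(b)
    return rhs - lhs

for a, b in [(1.0, 1.0), (0.1, 10.0), (5.0, 0.01)]:
    assert ruzsa_gap(a, b) >= 0.0
    print(a, b, round(ruzsa_gap(a, b), 4))
```

Equal variances give the smallest relative gap (log 2 nats), consistent with sums and differences having identical entropy in the symmetric Gaussian case.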
Conditional R\'enyi entropy and the relationships between R\'enyi capacities
The analogues of Arimoto's definition of conditional R\'enyi entropy and
R\'enyi mutual information are explored for abstract alphabets. These
quantities, although dependent on the reference measure, have some useful
properties similar to those known in the discrete setting. In addition to
laying out some such basic properties and the relations to R\'enyi divergences,
the relationships between the families of mutual informations defined by
Sibson, Augustin-Csisz\'ar, and Lapidoth-Pfister, as well as the corresponding
capacities, are explored.
Comment: 17 pages, 1 figure
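For a concrete handle on Arimoto's definition in the finite-alphabet case that the abstract generalizes, recall $H_\alpha(X|Y) = \frac{\alpha}{1-\alpha} \log \sum_y \big( \sum_x P_{XY}(x,y)^\alpha \big)^{1/\alpha}$, which recovers Shannon's $H(X|Y)$ as $\alpha \to 1$. An illustrative sketch (the joint pmf below is hypothetical):

```python
import math

def arimoto_cond_entropy(p_xy, alpha):
    """Arimoto's conditional Renyi entropy H_alpha(X|Y) in nats, for a joint
    pmf given as a dict {(x, y): prob}; requires alpha > 0, alpha != 1."""
    ys = {y for (_, y) in p_xy}
    total = 0.0
    for y in ys:
        s = sum(p ** alpha for (x, yy), p in p_xy.items() if yy == y)
        total += s ** (1.0 / alpha)
    return (alpha / (1.0 - alpha)) * math.log(total)

def shannon_cond_entropy(p_xy):
    """Shannon H(X|Y) = H(X,Y) - H(Y), in nats."""
    py = {}
    for (x, y), p in p_xy.items():
        py[y] = py.get(y, 0.0) + p
    hxy = -sum(p * math.log(p) for p in p_xy.values() if p > 0)
    hy = -sum(p * math.log(p) for p in py.values() if p > 0)
    return hxy - hy

# Hypothetical binary joint pmf with correlated coordinates.
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
# As alpha -> 1, Arimoto's conditional entropy approaches Shannon's H(X|Y).
print(arimoto_cond_entropy(p, 0.999), shannon_cond_entropy(p))
```

The abstract's point is that this definition, and the associated mutual informations of Sibson, Augustin-Csisz\'ar, and Lapidoth-Pfister, can be extended beyond discrete alphabets once a reference measure is fixed.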